
If the first $m$ steps of a Markov process lead from $a_j$ to some intermediate state $a_i$, then the probability of the subsequent passage from $a_i$ to $a_k$ does not depend on the manner in which $a_i$ was reached, that is,

$$ p_{jk}^{(m+n)} = \sum_i p_{ji}^{(m)} \, p_{ik}^{(n)} \,, \qquad (11.5) $$

where $p_{jk}^{(n)}$ is the probability of a transition from $a_j$ to $a_k$ in exactly $n$ steps (this is a special case of the Chapman–Kolmogorov identity).

If upon repeated application of $P$ the distribution $\mathbf{a}$ tends to an unchanging limit (i.e., an equilibrium set of states) that does not depend on the initial state, the Markov chain is said to be ergodic, and we can write

$$ \lim_{r \to \infty} P^r = Q \,, \qquad (11.6) $$

where $Q$ is a matrix with identical rows.$^7$ Now,

$$ P P^n = P^n P = P^{n+1} \,, \qquad (11.7) $$

and if $Q$ exists it follows, by letting $n \to \infty$, that

$$ PQ = QP = Q \,, \qquad (11.8) $$

from which $Q$ (giving the stationary probabilities; i.e., the equilibrium distribution of $\mathbf{a}$) can be found.
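A minimal numerical sketch of (11.6)–(11.8), again with an assumed three-state matrix: raising $P$ to a high power approximates $Q$, whose identical rows give the equilibrium distribution, and the same distribution can be recovered as the left eigenvector of $P$ belonging to eigenvalue 1:

import numpy as np

# Hypothetical ergodic transition matrix (same illustrative example as above).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

Q = np.linalg.matrix_power(P, 100)   # approximates lim_{r -> inf} P^r of (11.6)
print(Q)                             # all rows (numerically) identical

# Q satisfies the fixed-point relations (11.8).
assert np.allclose(P @ Q, Q) and np.allclose(Q @ P, Q)

# Equivalently, the stationary row vector pi solves pi P = pi with sum(pi) = 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.isclose(eigvals, 1.0))])
pi = pi / pi.sum()
print(pi)                            # matches any row of Q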

If all the transitions of a Markov chain are equally probable, then there is a

complete absence of constraint; the process is purely random (a zeroth-order chain).

Higher order Markov processes have already been discussed (see Sect. 6.2).
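As a small illustration (the number of states is chosen arbitrarily), a chain whose transitions are all equally probable has a uniform transition matrix, which, as noted in footnote 7, already has identical rows and is therefore its own limit $Q$:

import numpy as np

k = 4                           # number of states (assumed for illustration)
P0 = np.full((k, k), 1.0 / k)   # every transition equally probable
# The uniform matrix has identical rows, so repeated application leaves it unchanged.
assert np.allclose(np.linalg.matrix_power(P0, 10), P0)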

A Markov chain represents an automaton (cf. Sect. 12.1.1) working incessantly.

If the transformations were determinate (i.e., all entries in the transition matrix were

0 or 1), then the automaton would reach an attractor after a finite number of steps.

The nondeterminate transformation can, however, continue indefinitely (although if

any diagonal element is unity, it will get stuck there). If chains are nested inside one

another, one has a hidden Markov model (HMM; see Sect. 17.5.2): suppose that

the transformations accomplished by an automaton are controlled by a parameter

that can take values $a_1$ or $a_2$, say. If $a_1$ is input, the automaton follows one matrix of transitions, and if $a_2$ is input, it follows another set. The HMM is created if transitions between $a_1$ and $a_2$ are also Markovian. Markov chain Monte Carlo (MCMC) is used

when the number of unknowns is itself an unknown.
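The nested-chain picture of an HMM described above can be sketched as follows; all matrices here are hypothetical and serve only to show the two levels: the parameter $a_1$/$a_2$ moving Markovianly, and the automaton following whichever transition matrix the current parameter value selects.

import numpy as np

rng = np.random.default_rng(0)

# Markovian transitions between the hidden parameter values a1 and a2 (assumed numbers).
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# One observed-state transition matrix per hidden value (both hypothetical).
P = [np.array([[0.7, 0.3],
               [0.4, 0.6]]),
     np.array([[0.1, 0.9],
               [0.5, 0.5]])]

hidden, state = 0, 0      # start with parameter a1 and observed state 0
trajectory = []
for _ in range(20):
    state = rng.choice(2, p=P[hidden][state])   # automaton follows the matrix selected by a1/a2
    hidden = rng.choice(2, p=A[hidden])         # the parameter itself moves Markovianly
    trajectory.append(state)
print(trajectory)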

One of the difficulties in the use of Markov chains to model processes is to

ensure adequate statistical justification for any conclusions. The problem essentially

concerns the inferences about the transition probabilities that one would like to

$^7$ As for the transition matrix for a zeroth-order chain (i.e., independent trials).